# High-dimensional vectors
**Qwen3 Embedding 8B GGUF** · Apache-2.0 · Mungert · 612 downloads · 1 like
Qwen3-Embedding-8B is the Qwen family's latest model built specifically for text embedding and ranking tasks. It is based on the dense base model of the Qwen3 series and offers excellent multilingual capability and long-text understanding.
Tags: Text Embedding
**Plamo Embedding 1b** · Apache-2.0 · pfnet · 33.48k downloads · 25 likes
PLaMo-Embedding-1B is a Japanese text embedding model developed by Preferred Networks that demonstrates outstanding performance on Japanese text embedding benchmarks.
Tags: Text Embedding · Transformers · Japanese
**Langcache Embed V1** · redis · 2,138 downloads · 1 like
A sentence-transformers model fine-tuned from Alibaba-NLP/gte-modernbert-base for semantic textual similarity, intended to power semantic caching.
Tags: Text Embedding
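The semantic-caching idea behind a model like this can be sketched with plain cosine similarity: embed each incoming query, compare it against previously cached query vectors, and reuse the cached answer when similarity exceeds a threshold. A minimal NumPy sketch with toy low-dimensional vectors standing in for real model embeddings (the `SemanticCache` class and the 0.9 threshold are illustrative assumptions, not part of the model card):

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class SemanticCache:
    """Toy semantic cache: reuse an answer when a new query's
    embedding is close enough to a previously seen query's."""

    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.entries = []  # list of (embedding, answer) pairs

    def put(self, query_vec, answer):
        self.entries.append((np.asarray(query_vec, dtype=float), answer))

    def get(self, query_vec):
        # Return the cached answer for the most similar stored query,
        # or None if nothing clears the similarity threshold.
        best, best_sim = None, -1.0
        for vec, answer in self.entries:
            sim = cosine(np.asarray(query_vec, dtype=float), vec)
            if sim > best_sim:
                best, best_sim = answer, sim
        return best if best_sim >= self.threshold else None

# Toy 3-d "embeddings" stand in for the model's real vector output.
cache = SemanticCache(threshold=0.9)
cache.put([1.0, 0.0, 0.0], "cached answer")
print(cache.get([0.99, 0.05, 0.0]))  # near-duplicate query -> "cached answer"
print(cache.get([0.0, 1.0, 0.0]))    # unrelated query -> None
```

In a real deployment the toy vectors would come from the embedding model, and the linear scan would typically be replaced by a vector index.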
**Vietnamese Embedding** · AITeamVN · 14.26k downloads · 26 likes
A Vietnamese embedding model fine-tuned from BGE-M3, improving Vietnamese retrieval capability.
Tags: Text Embedding · Other
**Mmlw Retrieval E5 Base** · Apache-2.0 · sdadas · 144 downloads · 1 like
MMLW ("I Must Get Better Messages") is a Polish neural text encoder optimized for information retrieval, converting queries and passages into 768-dimensional vectors.
Tags: Text Embedding · Transformers · Other
**All Mpnet Base V2** · MIT · navteca · 14 downloads · 1 like
A sentence embedding model based on the MPNet architecture that maps text into a 768-dimensional vector space, suitable for semantic search and sentence-similarity tasks.
Tags: Text Embedding · English
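Semantic search with any of the embedding models above follows the same pattern: embed a corpus of passages once, embed the query, and rank passages by cosine similarity. A minimal NumPy sketch using random vectors in place of real 768-dimensional model outputs (only the dimensionality matches the cards; everything else is an illustrative assumption):

```python
import numpy as np

def top_k(query_vec, passage_matrix, k=3):
    # Normalize rows so the dot product equals cosine similarity.
    q = query_vec / np.linalg.norm(query_vec)
    p = passage_matrix / np.linalg.norm(passage_matrix, axis=1, keepdims=True)
    sims = p @ q                   # one similarity score per passage
    order = np.argsort(-sims)[:k]  # indices of the k most similar passages
    return order, sims[order]

rng = np.random.default_rng(0)
passages = rng.normal(size=(100, 768))  # stand-ins for 768-d passage embeddings
# A query vector very close to passage 42, simulating a paraphrase.
query = passages[42] + 0.01 * rng.normal(size=768)

idx, scores = top_k(query, passages, k=3)
print(idx[0])  # passage 42 ranks first
```

For large corpora the brute-force matrix product is usually replaced by an approximate nearest-neighbor index, but the similarity computation is the same.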